A Dual Augmented Block Minimization Framework for Learning with Limited Memory
Authors
Abstract
In the past few years, several techniques have been proposed for training linear Support Vector Machines (SVM) in the limited-memory setting, where a dual block-coordinate descent (dual-BCD) method is used to balance the cost spent on I/O against the cost of computation. In this paper, we consider the more general setting of regularized Empirical Risk Minimization (ERM) when the data cannot fit into memory. In particular, we generalize the existing block minimization framework, based on strong duality and the Augmented Lagrangian technique, to achieve global convergence for general convex ERM. The block minimization framework is flexible in the sense that, given a solver that works under sufficient memory, one can integrate it with the framework to obtain a solver that is globally convergent under the limited-memory condition. We conduct experiments on L1-regularized classification and regression problems to corroborate our convergence theory and to compare the proposed framework against algorithms adapted from the online and distributed settings, demonstrating the superiority of the proposed approach on data ten times larger than the memory capacity.
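The dual-BCD idea the abstract builds on can be illustrated with a minimal sketch (not the paper's actual algorithm): data is partitioned into blocks that are loaded one at a time, and only the current block's dual variables are updated while the shared primal vector `w` is kept in memory. The function `dual_bcd_svm` and its setup below are hypothetical names for illustration, assuming an L2-regularized linear SVM with hinge loss.

```python
import numpy as np

def dual_bcd_svm(blocks, n_features, C=1.0, epochs=5):
    """Illustrative dual block-coordinate descent for a linear SVM.

    `blocks` is a list of (X, y) chunks standing in for data read from
    disk one block at a time. Only w (size n_features) and one block's
    dual variables alpha need to reside in memory together.
    """
    w = np.zeros(n_features)
    alphas = [np.zeros(len(y)) for _, y in blocks]
    for _ in range(epochs):
        for b, (X, y) in enumerate(blocks):   # "I/O step": load block b
            a = alphas[b]
            for i in range(len(y)):           # dual coordinate updates
                g = y[i] * (X[i] @ w) - 1.0   # gradient of dual coordinate i
                q = X[i] @ X[i]
                if q > 0:
                    a_new = min(max(a[i] - g / q, 0.0), C)
                    # maintain w = sum_i alpha_i * y_i * x_i incrementally
                    w += (a_new - a[i]) * y[i] * X[i]
                    a[i] = a_new
    return w
```

In a real limited-memory implementation the inner list `blocks` would be replaced by reads from compressed files on disk, which is where the I/O-versus-computation trade-off the abstract mentions arises.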
Similar papers
Solving Multiple-Block Separable Convex Minimization Problems Using Two-Block Alternating Direction Method of Multipliers
Abstract. In this paper, we consider solving multiple-block separable convex minimization problems using the alternating direction method of multipliers (ADMM). Motivated by the fact that the existing convergence theory for ADMM is mostly limited to the two-block case, we analyze in this paper, both theoretically and numerically, a new strategy that first transforms a multi-block problem into an equ...
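The two-block case that this snippet's convergence theory covers can be sketched on a toy problem (an illustrative example, not taken from the paper): minimize ||x - a||^2 + ||z - b||^2 subject to x = z, whose solution is (a + b) / 2. The function name `admm_two_block` is hypothetical.

```python
import numpy as np

def admm_two_block(a, b, rho=1.0, iters=100):
    """Two-block ADMM with scaled dual variable u for
    min ||x - a||^2 + ||z - b||^2  s.t.  x = z."""
    x = np.zeros_like(a)
    z = np.zeros_like(a)
    u = np.zeros_like(a)
    for _ in range(iters):
        # x-update: argmin_x ||x - a||^2 + (rho/2)||x - z + u||^2
        x = (2 * a + rho * (z - u)) / (2 + rho)
        # z-update: argmin_z ||z - b||^2 + (rho/2)||x - z + u||^2
        z = (2 * b + rho * (x + u)) / (2 + rho)
        # dual update on the constraint x - z = 0
        u = u + x - z
    return x, z
```

With three or more blocks, naively cycling the same updates can fail to converge, which is precisely the gap motivating the transformation described in the snippet.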
The role of working memory capacity in learning the relative timing of a motor task: emphasis on implicit and explicit approaches
Abstract. The aim of this study was to investigate the role of working memory capacity and of errorless and errorful practice in learning the relative timing of a motor task. Fifty participants aged 22±4 years were selected as an accessible sample and randomly assigned to one of four groups (errorless low working memory capacity, errorful low working memory capacity, errorless high working me...
A Block Successive Upper Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
Consider the problem of minimizing the sum of a smooth convex function and a separable nonsmooth convex function subject to linear coupling constraints. Problems of this form arise in many contemporary applications including signal processing, wireless networking, and smart grid provisioning. Motivated by the huge size of these applications, we propose a new class of first-order primal-dual algo...
Constrained convex minimization via model-based excessive gap
We introduce a model-based excessive gap technique to analyze first-order primal-dual methods for constrained convex minimization. As a result, we construct new primal-dual methods with optimal convergence rates on the objective residual and the primal feasibility gap of their iterates separately. Through a dual smoothing and prox-function selection strategy, our framework subsumes the augmented...
Distributed Block-diagonal Approximation Methods for Regularized Empirical Risk Minimization
Designing distributed algorithms for empirical risk minimization (ERM) has become an active research topic in recent years because of the practical need to deal with huge volumes of data. In this paper, we propose a general framework for training an ERM model by solving its dual problem in parallel over multiple machines. Our method provides a versatile approach for many large-scale machine...